Search for: All records

Creators/Authors contains: "Patra, Abani"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Hybrid rocket motors with paraffin-based fuels are of interest due to their higher regression rates compared to other polymer fuels. During paraffin combustion, a liquid layer forms on the fuel surface that, together with shearing forces from the oxidizer flow, results in the formation of instabilities at the fuel-oxidizer interface. These instabilities lead to the formation and entrainment of heterogeneously sized liquid droplets into the main flow, and the combustion of these droplets increases motor output. The atomization process begins with droplet formation and ends with droplet pinch-off. The goal of this paper is to conduct an uncertainty quantification (UQ) analysis of the pinch-off process, characterized by a pinch-off volume ($V_{po}$) and time ($t_{po}$). We study these quantities of interest (QoIs) in the context of a slab burner setup. We have developed a computationally expensive mathematical model that describes droplet formation under external forcing and trained an inexpensive Gaussian Process surrogate of the model to facilitate UQ. We use the pinch-off surrogate to forward propagate the uncertainty of the model inputs to the QoIs and conduct two studies: one with gravity present and one without gravity effects. After forward-propagating the uncertainty of the inputs using the surrogate, we concluded that both QoIs have right-skewed distributions, corresponding to larger probability densities towards smaller pinch-off volumes and times. Specifically, for the pinch-off times, the resulting distributions reflect the effect of gravity acting against droplet formation, resulting in longer pinch-off times compared to the case where there is no gravity.
    Free, publicly-accessible full text available January 4, 2025
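The surrogate-based forward propagation described in the abstract can be sketched generically: fit an inexpensive Gaussian-process regressor to a handful of expensive model runs, then push Monte Carlo samples of the uncertain input through the surrogate and inspect the skewness of the resulting QoI distribution. The one-dimensional toy model, kernel length scale, and input distribution below are illustrative placeholders, not the paper's pinch-off model.

```python
import numpy as np

# Toy stand-in for the expensive droplet-formation model (hypothetical
# one-input form; the real model is far more involved).
def expensive_model(x):
    return 1.0 + 0.5 * np.exp(-3.0 * x) + 0.1 * x**2

def rbf(a, b, length_scale=0.4):
    # Squared-exponential kernel between 1-D sample arrays a and b.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale**2)

# Train a minimal Gaussian-process surrogate on a few expensive runs.
X = np.linspace(0.0, 2.0, 12)
y = expensive_model(X)
K = rbf(X, X) + 1e-8 * np.eye(len(X))   # jitter for numerical stability
alpha = np.linalg.solve(K, y)

def surrogate(x_new):
    # GP posterior mean: k(x*, X) @ K^{-1} y
    return rbf(np.atleast_1d(x_new), X) @ alpha

# Forward-propagate input uncertainty through the cheap surrogate.
rng = np.random.default_rng(0)
x_samples = np.clip(rng.normal(1.0, 0.3, size=5000), 0.0, 2.0)
t_po = surrogate(x_samples)

# Positive sample skewness would indicate a right-skewed QoI distribution.
skew = ((t_po - t_po.mean()) ** 3).mean() / t_po.std() ** 3
print(f"surrogate mean t_po = {t_po.mean():.3f}, skewness = {skew:.3f}")
```

In practice the surrogate replaces thousands of expensive solver calls, which is what makes the Monte Carlo propagation affordable.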
  2. Abstract Circulating tumor cell clusters (CTCCs) are rare cellular events found in the blood stream of metastatic tumor patients. Despite their scarcity, they represent an increased risk for metastasis. Label-free detection methods for these events remain primarily limited to in vitro microfluidic platforms. Here, we expand the use of confocal backscatter and fluorescence flow cytometry (BSFC) for label-free detection of CTCCs in whole blood, using machine learning for peak detection and classification. BSFC uses a custom-built flow cytometer with three excitation wavelengths (405 nm, 488 nm, and 633 nm) and five detectors to detect CTCCs in whole blood based on the corresponding scattering and fluorescence signals. In this study, detection of CTCC-associated GFP fluorescence is used as the ground truth to assess the accuracy of endogenous backscattered-light-based CTCC detection in whole blood. Using a machine learning model for peak detection and classification, we demonstrated that the combined use of backscattered signals at the three wavelengths enables detection of ~93% of all CTCCs larger than two cells, with a purity of >82% and an overall accuracy of >95%. The high level of performance established through BSFC and machine learning demonstrates the potential for label-free detection and monitoring of CTCCs in whole blood. Further development of label-free BSFC to enhance throughput could lead to important applications in the isolation of CTCCs in whole blood with minimal disruption, and ultimately their detection in vivo.
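As a minimal illustration of the peak-classification step, the sketch below trains a logistic-regression classifier on synthetic three-channel backscatter peak features. The feature values, class separation, and classifier choice are invented for illustration; the study's actual machine learning model and detector signals are not reproduced here.

```python
import numpy as np

# Synthetic stand-in for peak features from the three backscatter
# channels (405/488/633 nm); real features would come from detected
# peaks in the cytometer signals. The class separation is invented.
rng = np.random.default_rng(1)
n = 400
ctcc  = rng.normal([2.0, 2.2, 1.8], 0.6, size=(n, 3))   # "CTCC" peaks
other = rng.normal([0.5, 0.6, 0.4], 0.6, size=(n, 3))   # background peaks
X = np.vstack([ctcc, other])
y = np.r_[np.ones(n), np.zeros(n)]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

# Minimal logistic-regression classifier trained by gradient descent,
# combining all three channels as the abstract describes.
Xb = np.hstack([X, np.ones((2 * n, 1))])   # append a bias column
w = np.zeros(4)
for _ in range(500):
    p = sigmoid(Xb @ w)
    w -= 0.1 * Xb.T @ (p - y) / len(y)

acc = ((sigmoid(Xb @ w) > 0.5) == y).mean()
print(f"training accuracy on synthetic peaks: {acc:.2%}")
```

The point of combining channels is that a peak ambiguous in one wavelength can still be separated in the joint feature space.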
  3. Abstract In Southeast Greenland, summer melt and high winter snowfall rates give rise to firn aquifers: vast stores of meltwater buried beneath the ice-sheet surface. Previous detailed studies of a single Greenland firn aquifer site suggest that the water drains into crevasses, but this is not known at a regional scale. We develop and use a tool in Ghub, an online gateway of shared datasets, tools and supercomputing resources for glaciology, to identify crevasses from elevation data collected by NASA's Airborne Topographic Mapper across 29,000 km² of Southeast Greenland. We find crevasses within 3 km of the previously mapped downglacier boundary of the firn aquifer at 20 of 25 flightline crossings. Our data suggest that crevasses widen until they reach the downglacier boundary of the firn aquifer, implying that crevasses collect firn-aquifer water, but we did not find this trend with statistical significance. The median crevasse width, 27 m, implies an aspect ratio consistent with the crevasses reaching the bed. Our results support the idea that most water in Southeast Greenland firn aquifers drains through crevasses. Less common fates are discharge at the ice-sheet surface (3 of 25 sites) and refreezing at the aquifer bottom (1 of 25 sites).
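A simplified version of the detection idea — flagging crevasse-like depressions in an along-track elevation profile and measuring their widths — can be sketched as follows. The synthetic profile, linear detrending, and 1 m depth threshold are illustrative assumptions, not the Ghub tool's actual algorithm.

```python
import numpy as np

# Synthetic along-track surface elevation (5 m posting) with two
# crevasse-like depressions; real input would be ATM lidar profiles.
x = np.arange(0.0, 2000.0, 5.0)
surface = 1500.0 - 0.01 * x   # gentle downglacier slope
for center, width, depth in [(600.0, 30.0, 4.0), (1400.0, 25.0, 3.0)]:
    surface = surface - depth * np.exp(-0.5 * ((x - center) / (width / 4.0)) ** 2)

# Remove the regional slope with a linear fit, then flag points that
# sit well below the trend (1 m threshold, an illustrative choice).
trend = np.polyval(np.polyfit(x, surface, 1), x)
flagged = (surface - trend) < -1.0

# Group contiguous flagged points into crevasses and measure widths.
edges = np.flatnonzero(np.diff(flagged.astype(int)))
widths = [x[b] - x[a] for a, b in zip(edges[::2], edges[1::2])]
print("detected crevasse widths (m):", widths)
```

Widths measured this way can then be compared against a flightline's position relative to the mapped aquifer boundary, as the abstract describes.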
  4. Probabilistic hazard assessments for studying overland pyroclastic flows or atmospheric ash clouds under the short timelines of an evolving crisis require using the best science available, unhampered by complicated and slow manual workflows. Although deterministic mathematical models are available, in most cases the parameters and initial conditions for the equations are known only within a prescribed range of uncertainty. For the construction of probabilistic hazard assessments, accurate outputs and propagation of the inherent input uncertainty to quantities of interest are needed to estimate the necessary probabilities from numerous runs of the underlying deterministic model. Characterizing the uncertainty in system states due to parametric and input uncertainty simultaneously requires ensemble-based methods to explore the full parameter and input spaces. Complex tasks, such as running thousands of instances of a deterministic model with parameter and input uncertainty, require a high-performance computing infrastructure and skilled personnel that may not be readily available to the policy makers responsible for making informed risk-mitigation decisions. For efficiency, the programming tasks required for executing ensemble simulations need to run in parallel, leading to the twin computational challenges of managing large amounts of data and performing CPU-intensive processing. The resulting flow of work requires complex sequences of tasks, interactions, and exchanges of data, so automatic management of these workflows is essential. Here we discuss a computer infrastructure, methodology, and tools that enable scientists and other members of the volcanology research community to develop workflows for the construction of probabilistic hazard maps using remotely accessed computing through a web portal.

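The ensemble pattern this abstract describes — many deterministic runs over sampled inputs, executed in parallel, then reduced to exceedance probabilities — can be sketched as below. The toy model, parameter ranges, and 1 m depth threshold are placeholders; a real workflow would dispatch TITAN2D-style simulations to HPC resources rather than threads.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one deterministic hazard-model run: maps a
# sampled (volume, friction) pair to a flow depth at a site of interest.
def run_model(params):
    volume, friction = params
    return volume * np.exp(-5.0 * friction)

rng = np.random.default_rng(0)

# Sample the uncertain input space for the ensemble.
samples = list(zip(rng.uniform(1.0, 10.0, 200), rng.uniform(0.1, 0.4, 200)))

# Dispatch ensemble members in parallel, as a workflow engine would
# (a portal would submit batch jobs instead of local threads).
with ThreadPoolExecutor(max_workers=8) as pool:
    depths = np.array(list(pool.map(run_model, samples)))

# Reduce the ensemble to a probability of exceeding a hazard threshold.
p_exceed = float(np.mean(depths > 1.0))
print(f"P(depth > 1 m) ≈ {p_exceed:.2f}")
```

The workflow-automation problem is precisely that of managing thousands of such dispatch-collect-reduce cycles reliably on shared infrastructure.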
  5. Effective volcanic hazard management in regions where populations live in close proximity to persistent volcanic activity involves understanding the dynamic nature of hazards and the associated risk. Emphasis until now has been placed on identification and forecasting of the escalation phase of activity, in order to provide adequate warning of what might be to come. However, understanding eruption hiatus and post-eruption unrest hazards, or how to quantify residual hazard after the end of an eruption, is also important and often key to timely post-eruption recovery. Unfortunately, in many cases when the level of activity lessens, the hazards, although reduced, do not necessarily cease altogether. This is due both to the imprecise nature of determining the "end" of an eruptive phase and to the possibility that post-eruption hazardous processes may continue to occur. An example of the latter is continued dome-collapse hazard from lava domes that have ceased to grow, or sector collapse of parts of volcanic edifices, including lava dome complexes. We present a new probabilistic model for forecasting pyroclastic density currents (PDCs) from lava dome collapse that takes into account the heavy-tailed distribution of the lengths of eruptive phases, the periods of quiescence, and the forecast window of interest. In the hazard analysis, we also consider probabilistic scenario models describing the flow's volume and initial direction. Further, with the use of statistical emulators, we combine these models with physics-based simulations of PDCs at Soufrière Hills Volcano to produce a series of probabilistic hazard maps for flow inundation over 5, 10, and 20 year periods. The development and application of this assessment approach is the first of its kind for the quantification of periods of diminished volcanic activity. As such, it offers evidence-based guidance for dome-collapse hazards that can be used to inform decision-making around provisions of access and reoccupation in areas around volcanoes that are becoming less active over time.
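A toy version of one ingredient of such a forecast — turning an assumed heavy-tailed distribution of quiescent-period lengths into probabilities for each forecast window — might look like this. The Pareto form, its parameters, and the windows mirror only the abstract's general idea, not the paper's fitted model.

```python
import numpy as np

# Assumed heavy-tailed (Pareto) model for the length of a quiescent
# period; the scale (1 yr) and tail index (1.5) are illustrative only.
rng = np.random.default_rng(0)
x_min, tail_index = 1.0, 1.5
quiescence = x_min * (1.0 + rng.pareto(tail_index, size=100_000))

# Monte Carlo estimate of the chance that activity (and the associated
# dome-collapse hazard) resumes within each forecast window.
for window in (5, 10, 20):
    p = np.mean(quiescence < window)
    print(f"P(activity resumes within {window} yr) ≈ {p:.2f}")
```

A heavy tail matters here because it keeps a non-negligible probability mass on very long quiescent periods, which a thin-tailed model would understate.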
  6.
  7. Abstract. We detail a new prediction-oriented procedure aimed at volcanic hazard assessment based on geophysical mass flow models constrained with heterogeneous and poorly defined data. Our method relies on an itemized application of the empirical falsification principle over an arbitrarily wide envelope of possible input conditions. We thus provide a first step towards an objective and partially automated experimental design construction. In particular, instead of fully calibrating model inputs on past observations, we create and explore more general requirements of consistency, and then we separately use each piece of empirical data to remove those input values that are not compatible with it. Hence, partial solutions to the inverse problem are defined. This has several advantages compared to a traditionally posed inverse problem: (i) the potentially nonempty inverse images of partial solutions of multiple possible forward models characterize the solutions to the inverse problem; (ii) the partial solutions can provide hazard estimates under weaker constraints, potentially including extreme cases that are important for hazard analysis; (iii) if multiple models are applicable, specific performance scores against each piece of empirical information can be calculated. We apply our procedure to the case study of the Atenquique volcaniclastic debris flow, which occurred on the flanks of Nevado de Colima volcano (Mexico) in 1955. We adopt and compare three depth-averaged models currently implemented in the TITAN2D solver, available from https://vhub.org (Version 4.0.0 – last access: 23 June 2016). The associated inverse problem is not well-posed if approached in a traditional way. We show that our procedure can extract valuable information for hazard assessment, allowing the exploration of the impact of synthetic flows that are similar to those that occurred in the past but different in plausible ways. The implementation of multiple models is thus a crucial aspect of our approach, as they can cover other plausible flows. We also observe that model selection is inherently linked to the inversion problem.

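The itemized falsification over a wide input envelope can be illustrated schematically: sample inputs broadly, run a forward model, then apply each piece of observational data separately to discard incompatible inputs, leaving partial solutions. The toy forward model and the "observed" runout and depth ranges below are invented for illustration and have no connection to the Atenquique data or the TITAN2D rheologies.

```python
import numpy as np

# Sample a deliberately wide envelope of model inputs.
rng = np.random.default_rng(0)
n = 20_000
volume   = rng.uniform(1e5, 1e7, n)    # flow volume (m^3)
friction = rng.uniform(0.05, 0.45, n)  # basal friction coefficient

# Toy forward model for runout (km) and mean deposit depth (m);
# invented for illustration only.
runout = 2.0e-3 * volume ** (1.0 / 3.0) / friction
depth = 5.0e-7 * volume / runout

# Apply each piece of "observed" data separately to falsify inputs;
# the observation ranges below are also invented.
ok_runout = (runout > 3.0) & (runout < 12.0)   # observed runout window
ok_depth = depth < 4.0                          # observed deposit depth cap
surviving = np.column_stack([volume, friction])[ok_runout & ok_depth]

print(f"{len(surviving)} of {n} sampled inputs remain unfalsified")
```

Because each observation is applied separately, one can also score how much of the envelope each piece of data falsifies — the per-datum performance scores the abstract mentions.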